Reduced Basis Methods: From Low-Rank Matrices to Low-Rank Tensors
Authors
Abstract
We propose a novel combination of the reduced basis method with low-rank tensor techniques for the efficient solution of parameter-dependent linear systems in the case of several parameters. This combination, called rbTensor, consists of three ingredients. First, the underlying parameter-dependent operator is approximated by an explicit affine representation in a low-rank tensor format. Second, a standard greedy strategy is used to construct a problem-dependent reduced basis. Third, the associated reduced parametric system is solved for all parameter values on a tensor grid simultaneously via a low-rank approach. This allows us to explicitly represent and store an approximate solution for all parameter values at once. Once this approximation is available, the computation of output functionals and the evaluation of statistics of the solution become cheap online tasks that do not require the solution of any linear system.
Similar Resources
Robust Low-Rank Modelling on Matrices and Tensors
Robust low-rank modelling has recently emerged as a family of powerful methods for recovering the low-dimensional structure of grossly corrupted data, and has become successful in a wide range of applications in signal processing and computer vision. In principle, robust low-rank modelling focuses on decomposing a given data matrix into a low-rank and a sparse component, by minimising the rank ...
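The low-rank-plus-sparse decomposition described above can be sketched with a standard principal component pursuit iteration (an inexact augmented Lagrangian scheme); this is a generic illustration, not the specific method of the cited work, and all parameter choices below are common defaults rather than prescribed values.

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: shrink singular values by tau."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(X, tau):
    """Entrywise soft thresholding toward zero."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def rpca(M, lam=None, iters=60, rho=1.5):
    """Decompose M ~ L + S (L low rank, S sparse) via alternating
    thresholding with an increasing penalty parameter."""
    m, n = M.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    mu = 1.25 / np.linalg.norm(M, 2)      # common initialisation
    S = np.zeros_like(M)
    Y = np.zeros_like(M)                  # dual variable for M = L + S
    for _ in range(iters):
        L = svt(M - S + Y / mu, 1.0 / mu)
        S = soft(M - L + Y / mu, lam / mu)
        Y += mu * (M - L - S)             # dual ascent on the constraint
        mu = min(mu * rho, 1e7)
    return L, S

# Recover a rank-2 matrix corrupted by 5% large sparse outliers.
rng = np.random.default_rng(1)
L0 = rng.standard_normal((60, 2)) @ rng.standard_normal((2, 60))
S0 = np.zeros((60, 60))
mask = rng.random((60, 60)) < 0.05
S0[mask] = 10 * rng.standard_normal(mask.sum())
L_hat, S_hat = rpca(L0 + S0)
rel_err = np.linalg.norm(L_hat - L0) / np.linalg.norm(L0)
```

The two thresholding operators are the proximal maps of the nuclear norm and the entrywise 1-norm, which is why this simple alternation targets the convex low-rank-plus-sparse objective.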
Discretized Dynamical Low-Rank Approximation in the Presence of Small Singular Values
Low-rank approximations to large time-dependent matrices and tensors are the subject of this paper. These matrices and tensors are either given explicitly or are the unknown solutions of matrix and tensor differential equations. Based on splitting the orthogonal projection onto the tangent space of the low-rank manifold, novel time integrators for obtaining approximations by low-rank matrices a...
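One step of such a splitting integrator can be sketched in its simplest matrix form (a sketch following the general splitting idea, with names chosen here for illustration): a rank-r factorisation Y = U S V^T is advanced through a K-step, a backward S-step, and an L-step, and the step is exact whenever the matrix path itself has rank r.

```python
import numpy as np

def projector_splitting_step(U, S, Vt, dA):
    """One first-order splitting step for a rank-r factorisation.

    U: (m, r), S: (r, r), Vt: (r, n); dA is the increment A(t1) - A(t0).
    """
    # K-step: update the left factor with S absorbed, re-orthogonalise.
    K = U @ S + dA @ Vt.T
    U1, S_hat = np.linalg.qr(K)
    # S-step: the core is integrated *backwards* (note the minus sign),
    # the characteristic feature of this splitting.
    S_tilde = S_hat - U1.T @ dA @ Vt.T
    # L-step: update the right factor, re-orthogonalise.
    L = (S_tilde @ Vt).T + dA.T @ U1
    V1, S1t = np.linalg.qr(L)
    return U1, S1t.T, V1.T

# Exactness check: if A(t0) and A(t1) both have rank r, one step
# starting from A(t0) reproduces A(t1) exactly (generically).
rng = np.random.default_rng(2)
r = 2
A0 = rng.standard_normal((30, r)) @ rng.standard_normal((r, 40))
A1 = rng.standard_normal((30, r)) @ rng.standard_normal((r, 40))
U, s, Vt = np.linalg.svd(A0, full_matrices=False)
U1, S1, V1t = projector_splitting_step(U[:, :r], np.diag(s[:r]), Vt[:r],
                                       A1 - A0)
err = np.linalg.norm(U1 @ S1 @ V1t - A1) / np.linalg.norm(A1)
```

Note that the step never forms the full matrix Y explicitly apart from the increment; only the thin factors and the small r-by-r core are updated.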
Dynamical Approximation by Hierarchical Tucker and Tensor-Train Tensors
We extend results on the dynamical low-rank approximation for the treatment of time-dependent matrices and tensors (Koch & Lubich, 2007 and 2010) to the recently proposed Hierarchical Tucker tensor format (HT, Hackbusch & Kühn, 2009) and the Tensor Train format (TT, Oseledets, 2011), which are closely related to tensor decomposition methods used in quantum physics and chemistry. In this dynamic...
Journal:
- SIAM J. Scientific Computing
Volume 38, Issue
Pages -
Publication date: 2016